
    On the dialog between experimentalist and modeler in catchment hydrology

    The dialog between experimentalist and modeler in catchment hydrology has been minimal to date. The experimentalist often has a highly detailed yet largely qualitative understanding of dominant runoff processes, so there is often far more information about the catchment than is used to calibrate a model. While modelers often appreciate the need for 'hard data' in the calibration process, little thought has been given to how modelers might access this 'soft' or process knowledge. We present a new method in which soft data (i.e., qualitative knowledge from the experimentalist that cannot be used directly as exact numbers) are made useful through fuzzy measures of model-simulation and parameter-value acceptability. We developed a three-box lumped conceptual model for the Maimai catchment in New Zealand, a particularly well-studied process-hydrological research catchment. The boxes represent the key hydrological reservoirs, which are known to have distinct groundwater dynamics, isotopic composition, and solute chemistry. The model was calibrated against hard data (runoff and groundwater levels) as well as a number of criteria derived from the soft data (e.g., percent new water and reservoir volume). We achieved very good fits for the three-box model when optimizing the parameter values against runoff alone (Reff = 0.93). However, parameter sets obtained in this way generally showed poor goodness-of-fit for other criteria, such as the simulated new-water contributions to peak runoff. Including the soft-data criteria in the calibration resulted in lower Reff values (around 0.84 when all criteria were included) but led to better overall performance as judged against the experimentalist's view of catchment runoff dynamics. Model performance with respect to the soft data (for instance, the new-water ratio) improved significantly, and parameter uncertainty was reduced by 60% on average with the introduction of the soft-data multi-criteria calibration. We argue that accepting lower model efficiencies for runoff is 'worth it' if one can develop a more 'real' model of catchment behavior. The use of soft data is an approach to formalize this exchange between experimentalist and modeler and to more fully utilize the information content of experimental catchments.
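
    As an illustration of how such fuzzy acceptability measures might be combined with a conventional runoff efficiency, the sketch below scores one soft criterion (the new-water fraction) with a trapezoidal acceptability function and averages it with Reff. The acceptability bounds, the simulated new-water value, and the equal weighting are assumptions made for illustration, not the method as implemented in the paper.

```python
import numpy as np

def fuzzy_acceptability(value, a, b, c, d):
    """Trapezoidal fuzzy measure: 0 outside [a, d], 1 inside [b, c],
    linear ramps in between (one illustrative form of a 'soft' criterion)."""
    if value <= a or value >= d:
        return 0.0
    if b <= value <= c:
        return 1.0
    return (value - a) / (b - a) if value < b else (d - value) / (d - c)

# Hypothetical example: a parameter set gives Reff = 0.84 against observed
# runoff (the 'hard' criterion) and simulates 22% 'new water' in peak runoff;
# the experimentalist judges 25-35% fully acceptable and anything outside
# 15-45% unacceptable (assumed bounds).
reff = 0.84
soft_score = fuzzy_acceptability(0.22, a=0.15, b=0.25, c=0.35, d=0.45)
overall = np.mean([reff, soft_score])  # simple equal weighting, one of many choices
print(f"Reff={reff:.2f}, soft={soft_score:.2f}, combined={overall:.2f}")
```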

    Hypervelocity impact microfoil perforations in the LEO space environment (LDEF, MAP AO-023 experiment)

    The Microabrasion Foil Experiment comprises arrays of frames, each supporting two layers of closely spaced metallic foils and a back-stop plate. The arrays, deploying aluminum and brass foils ranging from 1.5 to about 30 microns in thickness, were exposed for 5.78 years on NASA's LDEF at a mean altitude of 458 km. They were deployed on the North, South, East, West, and Space-pointing faces; the results presented comprise the perforation rates for each location as a function of foil thickness. Initial results refer primarily to aluminum foils of 5 microns thickness or greater. This penetration distribution, comprising 2,342 perforations in total, shows significantly differing characteristics for each detector face. Interpreted with the dynamics of particulate orbital mechanics, the anisotropy confirms the dominance of extraterrestrial particulates among those penetrating thicknesses greater than 20 microns of aluminum foil, yielding fluxes compatible with hyperbolic geocentric velocities. For thinner foils, a disproportionate increase in particle flux on the East, North, and South faces shows the presence of orbital particulates, which exceed the extraterrestrial component's perforation rate at 5-micron foil thickness by a factor of approximately 4.
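
    A rough sketch of how perforation counts per detector face could be converted into fluxes over the 5.78-year exposure is given below; the per-face counts and exposed foil areas are placeholders, not the experiment's reported values.

```python
# Hypothetical sketch: converting perforation counts on each LDEF face into
# fluxes (perforations per m^2 per second); counts and areas are placeholders.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
exposure_s = 5.78 * SECONDS_PER_YEAR  # exposure duration quoted in the abstract

# face -> (perforation count, exposed foil area in m^2) -- illustrative only
faces = {
    "East":  (620, 0.12),
    "West":  (180, 0.12),
    "North": (410, 0.12),
    "South": (395, 0.12),
    "Space": (240, 0.12),
}

for face, (count, area_m2) in faces.items():
    flux = count / (area_m2 * exposure_s)
    print(f"{face:>5}: {flux:.2e} perforations m^-2 s^-1")
```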

    Microscopic Description of Nuclear Fission: Fission Barrier Heights of Even-Even Actinides

    We evaluate the performance of modern nuclear energy density functionals for predicting inner and outer fission barrier heights and energies of fission isomers of even-even actinides. For isomer energies and outer barrier heights, we find that the self-consistent theory at the HFB level is capable of providing quantitative agreement with empirical data. Comment: 8 pages, 6 figures, 1 table; Proceedings of the 5th International Conference on "Fission and properties of neutron-rich nuclei" (ICFN5), Sanibel Island, Nov. 4-10, 201

    Intercomparison of soil pore water extraction methods for stable isotope analysis

    Funded by an NSERC Discovery Grant, the U.S. Forest Service, and the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy, Bioenergy Technologies Office. Peer reviewed. Postprint.

    The in-situ cometary particulate size distribution measured for one comet: P/Halley

    The close approach of Giotto to comet Halley during its 1986 apparition offered an opportunity to study the particulate mass distribution up to masses of one gram. Data acquired by the front-end channels of the highly sensitive mass spectrometer PIA and the dust shield detector system, DIDSY, define the detected distribution as close as 1,000 km to the nucleus. Dynamic motion of the particulates after emission leads to a spatial differentiation affecting the size distribution in several ways: (1) ejecta velocity dispersion; (2) radiation pressure; (3) varying heliocentric distance; and (4) anisotropic nucleus emission. Transformation of the in-situ distribution from PIA and DIDSY, weighted heavily by the near-nucleus fluxes, leads to a presumed nucleus distribution. The data show a puzzling feature at large masses that is not readily explained within an otherwise monotonic power-law distribution. Although temporal changes in nucleus activity could and do modify the in-situ size distribution, such an explanation cannot fully account for the feature, because the same form is observed at different locations in the coma, where the time of flight from the nucleus varies greatly. Thus neither a general change in comet activity nor spatial variation leads to a satisfactory explanation.
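
    To show the kind of monotonic power-law baseline against which an excess at large masses would stand out, here is a small sketch that recovers a power-law index from a cumulative mass distribution in log-log space; the masses, counts, and index are synthetic and are not the PIA/DIDSY data.

```python
import numpy as np

# Synthetic cumulative distribution: N(>m) = number of detected particulates
# with mass above m. A single power law N(>m) ~ m^(-alpha) is a straight line
# in log-log space, so an excess at large masses appears as points above it.
masses_g = np.logspace(-12, 0, 13)          # 1e-12 g up to 1 g
alpha_true = 0.8                            # assumed index, for illustration
counts = 1e4 * masses_g ** (-alpha_true)    # idealized power law

# Least-squares slope in log-log space recovers the power-law index.
slope, intercept = np.polyfit(np.log10(masses_g), np.log10(counts), 1)
print(f"fitted power-law index alpha = {-slope:.2f}")
```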

    Framework for Event-based Semidistributed Modeling that Unifies the SCS-CN Method, VIC, PDM, and TOPMODEL

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and lack ready-to-use analytical expressions analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and the antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill the semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions, called VICx, PDMx, and TOPMODELx, are also extended with a spatial description of 'prethreshold' and 'threshold-excess' runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For a given total storm rainfall and antecedent wetness condition, the resulting ready-to-use analytical expressions define the source areas (fractions of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths (cumulative values over the storm duration) and the average unit-area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
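
    For reference, the traditional SCS-CN runoff curve that these event-based expressions generalize can be written in a few lines. The sketch below uses the conventional Ia = 0.2 S initial-abstraction assumption; the curve number and storm depths are arbitrary examples.

```python
import numpy as np

def scs_cn_runoff(P_mm, CN, ia_ratio=0.2):
    """SCS-CN direct runoff Q (mm) for cumulative storm rainfall P (mm).
    S is the potential maximum retention; Ia = ia_ratio * S is the
    initial abstraction (0.2 is the conventional assumption)."""
    S = 25400.0 / CN - 254.0          # retention in mm for a given curve number
    Ia = ia_ratio * S
    P = np.asarray(P_mm, dtype=float)
    Q = np.where(P > Ia, (P - Ia) ** 2 / (P - Ia + S), 0.0)
    return Q

# Example: runoff curve for a moderately wet antecedent condition (CN = 80,
# an arbitrary choice) over storm totals from 0 to 150 mm.
P = np.linspace(0.0, 150.0, 7)
print(np.round(scs_cn_runoff(P, CN=80), 1))
```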